Job Title: Java Developer with Databricks, Scala, and AWS (W2 Only)
Location: Remote (candidate must work MST/Arizona time zone hours)
Minimum of 9 years of experience required.
Must-Have Skills: Scala, AWS S3, Databricks, Parquet, Web Services (SOAP/REST), Python.
Job Description: Backend Java Developer with AWS & Databricks Expertise
We are seeking a Backend Java Developer with a strong background in building scalable, reliable, and high-performance applications. The ideal candidate will have expertise in Java web services (REST/SOAP API development) and hands-on experience with AWS services, Databricks, and data engineering tools. Familiarity with Scala and the Parquet file format is a must; Python is a plus.
Key Responsibilities
- API Development: Design, develop, and maintain RESTful and SOAP APIs using Java.
- Backend Development: Build, enhance, and maintain backend systems and services, ensuring high performance, scalability, and security.
- Cloud Integration: Work with AWS cloud technologies, specifically S3 for data storage and integration.
- Data Engineering: Use Databricks and Scala for data engineering tasks, including building and maintaining data pipelines.
- Data Formats: Work with the Parquet file format for efficient data storage and retrieval.
- Collaboration: Collaborate with front-end developers, data engineers, and business stakeholders to ensure smooth integration of backend systems and data pipelines.
- Code Optimization: Write clean, well-documented code, and optimize backend systems for better performance and reliability.
- Version Control: Use Git for version control, ensuring smooth collaboration across teams.
- Testing: Develop and execute unit and integration tests to ensure the reliability and robustness of API services.
- Support & Troubleshooting: Provide support for production systems, resolve issues, and enhance performance as needed.
Required Skills and Qualifications
- Proven Experience as a Java API Developer, with a strong background in Java web services (REST/SOAP API development).
- Hands-on experience with AWS technologies, specifically S3 for object storage, and other AWS services for cloud infrastructure.
- Experience with Databricks for data engineering tasks, including building and maintaining data pipelines.
- Knowledge of the Parquet file format for efficient data storage and processing.
- Solid understanding of backend development principles, including performance tuning, API security, and cloud architecture.
- Experience with Python is a plus, particularly in data manipulation, automation, or scripting tasks.
- Strong problem-solving skills and the ability to debug and troubleshoot complex systems.
- Experience with version control systems like Git.
- Strong communication skills and the ability to collaborate effectively in a team environment.
If I missed your call, please drop me an email.
Thank you,
Harish
Talent Acquisition
Astir IT Solutions, Inc - An E-Verified Company
Email: harishj@astirit.com
Direct: 732-694-6000, Ext. 788